
    Analysis of the potentials of multi criteria decision analysis methods to conduct sustainability assessment

    Sustainability assessments require the management of a wide variety of information types, parameters and uncertainties. Multi-criteria decision analysis (MCDA) has been regarded as a suitable set of methods for performing sustainability evaluations because of its flexibility and its capacity to facilitate dialogue between stakeholders, analysts and scientists. However, it has been reported that researchers do not usually give proper reasons for choosing one MCDA method over another; familiarity and affinity with a certain approach seem to drive the choice of procedure. This review paper assesses the performance of five MCDA methods (MAUT, AHP, PROMETHEE, ELECTRE and DRSA) with respect to ten crucial criteria that sustainability assessment tools should satisfy, among them a life cycle perspective, thresholds and uncertainty management, software support and ease of use. The review shows that MAUT and AHP are fairly simple to understand and have good software support, but they are cognitively demanding for decision makers and can only embrace a weak sustainability perspective, since trade-offs are the norm. Mixed information and uncertainty can be managed by all the methods, while robust results can only be obtained with MAUT. ELECTRE, PROMETHEE and DRSA are non-compensatory approaches that allow a strong sustainability concept and accept a variety of thresholds, but suffer from rank reversal. DRSA is less demanding in terms of preference elicitation, is very easy to understand and provides a straightforward set of decision rules expressed in the form of elementary “if … then …” conditions. Dedicated software is available for all the approaches, with medium to wide capabilities for representing results. DRSA emerges as the easiest method, followed by AHP, PROMETHEE and MAUT, while ELECTRE is regarded as fairly difficult. Overall, the analysis shows that most of the requirements are satisfied by the MCDA methods, although to different extents; the exceptions are the management of mixed data types and the adoption of a life cycle perspective, which are covered by all the considered approaches.
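The elementary “if … then …” decision rules that DRSA produces can be pictured with a small sketch. The criteria (CO2, cost), thresholds and rules below are hypothetical illustrations, not taken from the review:

```python
# Toy sketch of DRSA-style "if ... then ..." decision rules applied to a
# sustainability profile. All criteria names and thresholds are invented
# for illustration; they are not from the paper under review.

def classify(alternative):
    """Apply elementary decision rules to a dict of criterion values."""
    # Rule 1: if CO2 is at most 50 and cost is at most 100,
    # then the alternative is classified "at least good".
    if alternative["co2"] <= 50 and alternative["cost"] <= 100:
        return "at least good"
    # Rule 2: if CO2 exceeds 120, the alternative is "at most poor",
    # regardless of cost (non-compensatory: no trade-off can rescue it).
    if alternative["co2"] > 120:
        return "at most poor"
    # No rule fires: the classification stays undetermined.
    return "uncertain"

print(classify({"co2": 40, "cost": 90}))   # -> at least good
print(classify({"co2": 150, "cost": 60}))  # -> at most poor
```

Note how Rule 2 ignores cost entirely: this is the non-compensatory character the abstract attributes to DRSA, in contrast to MAUT/AHP, where a good score on one criterion can offset a bad one.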

    Comparison of Magnetic Anomalies of Lithospheric Origin Measured by Satellite and Airborne Magnetometers over Western Canada

    Crustal magnetic anomaly data from the OGO 2, 4 and 6 (Pogo) satellites are compared with upward-continued aeromagnetic data between 50 deg–85 deg N latitude and 220 deg–260 deg E longitude. Agreement is good both in anomaly location and in amplitude, giving confidence that it is possible to proceed with the derivation and interpretation of satellite anomaly maps in all parts of the globe. The data contain a magnetic high over the Alpha ridge, suggesting continental composition, and a magnetic low over the southern Canada basin and northern Canadian Arctic islands (Sverdrup basin). The low in the Sverdrup basin corresponds to a region of high heat flow, suggesting a shallow Curie isotherm. A ridge of high field, with two distinct peaks in amplitude, is found over the northern portion of the platform deposits, and a relative high is located in the central portion of the Churchill province. No features are present to indicate a magnetic boundary between the Slave and Bear provinces, but a trend change is evident between the Slave and Churchill provinces. South of 60 deg latitude a broad magnetic low is located over very thick (40-50 km) crust, interpreted to be a region of low magnetization.

    We reap what we sew: perpetuating biblical illiteracy in new English religious studies exams and the proof text binary question

    This article draws on three sources of evidence that together indicate hermeneutical weaknesses in exam courses on Christianity in English Religious Education (RE). It scrutinizes a single exam paper and an associated textbook from a recent authorized course. It conceptually explores features of a new style of long Religious Studies (RS) exam question that is commonly set for the majority of students studying for an RS qualification at 15-16 years old. It combines these documentary sources with a focus group interview of teachers in their first year of teaching the new GCSE Religious Studies. The findings from the document analysis, conceptual analysis and focus group interview together indicate a problem related to the use of fragmentary texts and the promotion of a particularly propositional conception of religion. These features are structured in by systemic elements. A small proportion of students follow text-based GCSE routes that include a more detailed study of Biblical texts, but the majority of 15-16-year-old students do not and so are exposed to this problem. These weaknesses could be ‘designed out’ of exams with smarter questions and mitigated by curriculum content that specified the study of how texts are interpreted, as well as by teacher expertise in the teaching and practice of hermeneutics.

    The Cluster Distribution as a Test of Dark Matter Models. IV: Topology and Geometry

    We study the geometry and topology of the large-scale structure traced by galaxy clusters in numerical simulations of a box of side 320 h⁻¹ Mpc, and compare them with available data on real clusters. The simulations we use are generated by the Zel'dovich approximation, using the same methods as we have used in the first three papers in this series. We consider the following models to see if there are measurable differences in the topology and geometry of the superclustering they produce: (i) the standard CDM model (SCDM); (ii) a CDM model with Ω₀ = 0.2 (OCDM); (iii) a CDM model with a `tilted' power spectrum having n = 0.7 (TCDM); (iv) a CDM model with a very low Hubble constant, h = 0.3 (LOWH); (v) a model with mixed CDM and HDM (CHDM); (vi) a flat low-density CDM model with Ω₀ = 0.2 and a non-zero cosmological Λ term (ΛCDM). We analyse these models using a variety of statistical tests based on the analysis of: (i) the Euler–Poincaré characteristic; (ii) percolation properties; (iii) the Minimal Spanning Tree construction. Taking all these tests together, we find that the best-fitting model is ΛCDM and, indeed, the others do not appear to be consistent with the data. Our results demonstrate that, despite their biased and extremely sparse sampling of the cosmological density field, it is possible to use clusters to probe subtle statistical diagnostics of models which go far beyond the low-order correlation functions usually applied to study superclustering.
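Of the statistical tests listed, the Minimal Spanning Tree construction is the easiest to sketch in a few lines. The following toy example (illustration only; the point set is random and this is not the authors' analysis code) builds the Euclidean MST of mock "cluster" positions in a cubic box with Prim's algorithm:

```python
# Minimal sketch of a Euclidean Minimal Spanning Tree via Prim's algorithm.
# The 50 random 3-D points stand in for cluster positions; real analyses
# would use catalogued or simulated cluster coordinates instead.
import math
import random

def mst_edges(points):
    """Return MST edges as (length, i, j) tuples, grown from point 0."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        # Find the shortest edge connecting the tree to a new point.
        for i in in_tree:
            for j in range(n):
                if j in in_tree:
                    continue
                d = math.dist(points[i], points[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

random.seed(1)
pts = [tuple(random.uniform(0.0, 320.0) for _ in range(3)) for _ in range(50)]
edges = mst_edges(pts)
mean_len = sum(d for d, i, j in edges) / len(edges)
print(f"{len(edges)} MST edges, mean edge length {mean_len:.1f}")
```

Statistics of the resulting edge-length distribution (mean, spread, shape) are the kind of diagnostic the MST test compares between models and data; a tree on N points always has N−1 edges.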

    Bias and Hierarchical Clustering

    It is now well established that galaxies are biased tracers of the distribution of matter, although it is still not known what form this bias takes. In local bias models the propensity for a galaxy to form at a point depends only on the overall density of matter at that point. Hierarchical scaling arguments allow one to build a fully-specified model of the underlying distribution of matter and to explore the effects of local bias in the regime of strong clustering. Using a generating-function method developed by Bernardeau & Schaeffer (1992), we show that hierarchical models lead one directly to the conclusion that a local bias does not alter the shape of the galaxy correlation function relative to the matter correlation function on large scales. This provides an elegant extension of a result first obtained by Coles (1993) for Gaussian underlying fields and confirms the conclusions of Scherrer & Weinberg (1998) obtained using a different approach. We also argue that particularly dense regions in a hierarchical density field display a form of bias that is different from that obtained by selecting such peaks in Gaussian fields: they are themselves hierarchically distributed, with scaling parameters S_p = p^{p-2}. This kind of bias is also factorizable, thus in principle furnishing a simple test of this class of models.
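The quoted scaling relation S_p = p^{p-2} is simple enough to check numerically; the sketch below just evaluates the first few values (an illustration, not code from the paper):

```python
# Evaluate the hierarchical scaling parameters S_p = p^(p-2) quoted in the
# abstract for dense regions of a hierarchical density field.
def S(p):
    return p ** (p - 2)

for p in range(2, 6):
    print(p, S(p))  # S_2 = 1, S_3 = 3, S_4 = 16, S_5 = 125
```

The rapid growth with p is what makes the proposed test sharp in principle: successive scaling parameters are fixed with no free amplitude to tune.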

    Getting the Measure of the Flatness Problem

    The problem of estimating cosmological parameters such as Ω from noisy or incomplete data is an example of an inverse problem and, as such, generally requires a probabilistic approach. We adopt the Bayesian interpretation of probability for such problems and stress the connection between probability and information which this approach makes explicit. This connection is important even when information is “minimal” or, in other words, when we need to argue from a state of maximum ignorance. We use the transformation group method of Jaynes to assign a minimally informative prior probability measure for cosmological parameters in the simple example of a dust Friedmann model, showing that the usual statements of the cosmological flatness problem are based on an inappropriate choice of prior. We further demonstrate that, in the framework of a classical cosmological model, there is no flatness problem.